Tutorials fix with set_all_parameters #113
gmuraru wants to merge 3 commits into facebookresearch:main
Conversation
self.fc2.bias.data.fill_(value)
self.fc3.weight.data.fill_(value)
self.fc3.bias.data.fill_(value)
for p in self.parameters():
Q: Is there any reason why data.fill_ is used?
As far as I'm aware, torch.nn.init.constant_(param, value) is the same as param.data.fill_(value).
They should do the same thing, but I changed it to `constant_` because of an issue indicated in this PR.
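To illustrate the change discussed above, here is a minimal sketch of the pattern: replacing per-layer `param.data.fill_(value)` calls with a single loop over `self.parameters()` that uses `torch.nn.init.constant_`. The `Net` class and its layer sizes are hypothetical, chosen only to mirror the `fc1`/`fc2`/`fc3` structure visible in the diff.

```python
import torch
import torch.nn as nn


class Net(nn.Module):
    # Hypothetical three-layer module mirroring the fc1/fc2/fc3
    # structure shown in the diff above.
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 8)
        self.fc2 = nn.Linear(8, 8)
        self.fc3 = nn.Linear(8, 2)

    def set_all_parameters(self, value):
        # Generic loop replaces the per-layer .data.fill_(value) calls;
        # nn.init.constant_ is the idiomatic PyTorch initializer.
        for p in self.parameters():
            nn.init.constant_(p, value)


net = Net()
net.set_all_parameters(0.5)
```

The loop version also keeps working if layers are later added or renamed, since it never hard-codes attribute names.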
@@ -302,9 +298,8 @@ def forward(self, input):
    out = self.nested(out)

def set_all_parameters(self, value):
Q: Is there any plan to have a method like this added directly in PyTorch for nn.Module?
I don't believe there is any plan to do this. This function isn't defined in CrypTen either - it is defined in a unit test as a helper function. In general, I think it is left to the user to initialize (or otherwise set) parameter values.
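Since PyTorch leaves this to the user, a standalone helper is easy to write. The sketch below is a hypothetical free function (not part of PyTorch or CrypTen) that works for any `nn.Module`; the `torch.no_grad()` context is needed because the in-place `fill_` would otherwise fail on leaf tensors that require grad.

```python
import torch
import torch.nn as nn


def set_all_parameters(module: nn.Module, value: float) -> None:
    # Hypothetical helper: set every parameter of an arbitrary
    # nn.Module to a constant. no_grad() allows in-place writes
    # on parameters that require grad.
    with torch.no_grad():
        for p in module.parameters():
            p.fill_(value)


model = nn.Sequential(nn.Linear(3, 3), nn.ReLU(), nn.Linear(3, 1))
set_all_parameters(model, 1.0)
```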
Could you comment on what exactly was failing in the tutorial? We haven't reproduced failures on our end.
For Tutorial 4 it was failing at cell 4 -- when using the `set_all_parameters` method. The stack trace is the following:
Maybe this is failing only on the public repo? (thanks @LaRiffle)
Types of changes
Motivation and Context / Related issue
Some tutorial cells were failing because the `set_all_parameters` method does not exist for the modules from PyTorch. Also, iterate over all parameters in a more generic way and set the value using `nn.init.constant_` rather than the `data` attribute. (thanks @youben11 and @LaRiffle)

How Has This Been Tested (if it applies)
The tests are passing (WIP)
Checklist